Thematic Representation in Simple Recurrent Networks
Author
Abstract
Introduction
Simple recurrent networks (SRNs) are able to learn and represent lexical classes (Elman, 1990) and grammatical knowledge, such as agreement and argument structure (Elman, 1991), on the basis of co-occurrence regularities embedded in simple and complex sentences. In the present study, we address the question of whether SRNs can represent differences in the thematic roles assigned by verbs.
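To make the setup concrete, below is a minimal sketch of an Elman-style SRN trained on a toy next-word prediction task. The corpus, network size, and one-step training rule are illustrative assumptions and are not taken from the study itself.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy corpus of three-word sentences; the vocabulary is built from it.
    sentences = [["boy", "breaks", "window"],
                 ["girl", "eats", "apple"],
                 ["boy", "eats", "apple"],
                 ["girl", "breaks", "window"]]
    vocab = sorted({w for s in sentences for w in s})
    idx = {w: i for i, w in enumerate(vocab)}
    V, H = len(vocab), 8                      # vocabulary size, hidden units

    W_xh = rng.normal(0, 0.1, (H, V))         # input -> hidden
    W_hh = rng.normal(0, 0.1, (H, H))         # context (previous hidden state) -> hidden
    W_hy = rng.normal(0, 0.1, (V, H))         # hidden -> next-word scores

    def one_hot(i):
        v = np.zeros(V)
        v[i] = 1.0
        return v

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    lr = 0.1
    for epoch in range(500):
        for s in sentences:
            h = np.zeros(H)                   # context units start at rest
            for t in range(len(s) - 1):
                x = one_hot(idx[s[t]])
                h_prev = h
                h = np.tanh(W_xh @ x + W_hh @ h_prev)   # new state from input + context
                p = softmax(W_hy @ h)                   # predicted next-word distribution
                target = one_hot(idx[s[t + 1]])
                # One-step (truncated) gradient update on cross-entropy; enough for a sketch.
                dy = p - target
                dh = (W_hy.T @ dy) * (1 - h ** 2)
                W_hy -= lr * np.outer(dy, h)
                W_xh -= lr * np.outer(dh, x)
                W_hh -= lr * np.outer(dh, h_prev)

    # Hidden states for words that occur in similar contexts end up close together,
    # which is the sense in which an SRN comes to represent lexical classes.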
Similar resources
Solving Linear Semi-Infinite Programming Problems Using Recurrent Neural Networks
The linear semi-infinite programming problem is an important class of optimization problems that deals with infinitely many constraints. In this paper, to solve this problem, we combine a discretization method with a neural network method. By a simple discretization of the infinite constraints, we convert the linear semi-infinite programming problem into a linear programming problem. Then, we use...
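As a concrete illustration of the discretization step only, the sketch below grids the index set of an assumed toy problem (minimize x1 + x2 subject to t*x1 + (1 - t)*x2 >= 1 for all t in [0, 1]) and hands the resulting finite LP to a standard solver; the recurrent-network solver described in the paper is not reproduced here.

    import numpy as np
    from scipy.optimize import linprog

    c = np.array([1.0, 1.0])               # objective: minimize x1 + x2
    ts = np.linspace(0.0, 1.0, 101)        # finite grid over the index set [0, 1]

    # Each grid point t yields one linear constraint t*x1 + (1 - t)*x2 >= 1,
    # rewritten as -(t*x1 + (1 - t)*x2) <= -1 for linprog's A_ub x <= b_ub form.
    A_ub = -np.column_stack([ts, 1.0 - ts])
    b_ub = -np.ones_like(ts)

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 2)
    print(res.x, res.fun)                  # approximately [1, 1] and 2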
Instantiating Thematic Roles with a Recurrent Neural Network
NOTE: This report is available both on paper and in electronic form. The electronic version may lack illustrations and special characters. The paper version is available from The University of Georgia. Human languages have both a surface structure and a deep structure. The surface structure of a language can be described by grammatical rules which generate its well-formed sentences. Deep struct...
Automatic Acquisition of TD-Network in POMDP Environments: Extension with SRN Structure (POMDP 環境中での TD-Network の自動獲得: 単純再帰構造による拡張), by 牧野貴樹
We propose a new neural network architecture, Simple Recurrent TD Networks (SR-TDNs), that learns to predict future observations in partially observable environments, using a proto-predictive representation of states. SR-TDNs incorporate the structure of simple recurrent neural networks (SRNs) into temporal-difference (TD) networks to use a proto-predictive representation of states. Our simulation ...
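The following is a schematic sketch, under strong simplifying assumptions, of how an SRN-style hidden state can be combined with TD-network-style targets, where each prediction node is trained toward the next observation or the next value of another prediction. The environment, question network, and training rule are invented for illustration and do not reproduce the SR-TDN architecture of the paper.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy partially observable environment: a 4-state ring in which the agent only
    # observes a single bit (whether it currently sits at state 0).
    def step(state):
        state = (state + 1) % 4
        obs = 1.0 if state == 0 else 0.0
        return state, obs

    O, P, H = 1, 2, 6                         # observation size, predictions, hidden units
    W_xh = rng.normal(0, 0.1, (H, O + P))     # input = current obs + previous predictions
    W_hh = rng.normal(0, 0.1, (H, H))         # recurrent (SRN-style context) weights
    W_hy = rng.normal(0, 0.1, (P, H))         # hidden -> prediction nodes
    lr = 0.05

    state, obs = 0, 1.0
    h, y = np.zeros(H), np.zeros(P)
    for t in range(5000):
        x = np.concatenate(([obs], y))
        h = np.tanh(W_xh @ x + W_hh @ h)              # SRN-style state update
        y = 1.0 / (1.0 + np.exp(-(W_hy @ h)))         # current predictions

        state, next_obs = step(state)
        # "Question network" for this sketch: y[0] predicts the next observation,
        # y[1] predicts the next value of y[0] (a prediction about a prediction).
        x_next = np.concatenate(([next_obs], y))
        h_next = np.tanh(W_xh @ x_next + W_hh @ h)
        y_next = 1.0 / (1.0 + np.exp(-(W_hy @ h_next)))
        target = np.array([next_obs, y_next[0]])

        # Only the output weights are trained, with a one-step squared-error update,
        # to keep the sketch short.
        dy = (y - target) * y * (1.0 - y)
        W_hy -= lr * np.outer(dy, h)

        obs = next_obs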
A Comparison of the Classic NetTalk Text-to-Speech System to a Modern, Distributed Representation and Simple Recurrent Network
This paper reports on a comparison with the well-known NetTalk implementation of English text-to-speech translation via neural networks. A distributed representation scheme for encoding is investigated, as opposed to the classic localist representation scheme used in the original NetTalk. The paper discusses a modern re-implementation based on Elman's Simple Recurrent Network.
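A small illustration of the two input encodings being contrasted, assuming an invented feature set; the actual distributed code used in the paper is not reproduced here.

    import numpy as np

    letters = "abcdefghijklmnopqrstuvwxyz"

    def localist(ch):
        # One-hot vector: one dedicated input unit per letter (NetTalk-style localist code).
        v = np.zeros(len(letters))
        v[letters.index(ch)] = 1.0
        return v

    # Hypothetical distributed code: each letter is a pattern over a few shared
    # feature units (here: vowel, ascender, descender, rounded shape).
    features = {
        "a": [1, 0, 0, 1], "b": [0, 1, 0, 1], "e": [1, 0, 0, 1],
        "g": [0, 0, 1, 1], "t": [0, 1, 0, 0],
    }

    def distributed(ch):
        return np.array(features[ch], dtype=float)

    print(localist("b"))      # 26 units, exactly one active
    print(distributed("b"))   # 4 units, activity shared and overlapping across letters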
A Biologically Inspired Connectionist System for Natural Language Processing
Nowadays, artificial neural network models often lack many physiological properties of the nerve cell. Feedforward multilayer perceptron architectures, and even simple recurrent networks, still in vogue, are far from those encountered in the cerebral cortex. Current learning algorithms are oriented more toward computational performance than toward biological credibility. The aim of this paper is to propos...
Journal title:
Volume, Issue
Pages: -
Publication date: 2006